    Decidable Reasoning in Terminological Knowledge Representation Systems

    Terminological knowledge representation systems (TKRSs) are tools for designing and using knowledge bases that make use of terminological languages (or concept languages). We analyze, from a theoretical point of view, a TKRS whose capabilities go beyond those of presently available TKRSs. The new features studied, often required in practical applications, can be summarized in three main points. First, we consider a highly expressive terminological language, called ALCNR, including general complements of concepts, number restrictions, and role conjunction. Second, we allow the expression of inclusion statements between general concepts, with terminological cycles as a particular case. Third, we prove the decidability of a number of desirable TKRS deduction services (such as satisfiability, subsumption, and instance checking) through a sound, complete, and terminating calculus for reasoning in ALCNR knowledge bases. Our calculus extends the general technique of constraint systems. As a byproduct of the proof, we also obtain the result that inclusion statements in ALCNR can be simulated by terminological cycles, if descriptive semantics is adopted.
    Comment: See http://www.jair.org/ for any accompanying file
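    For orientation, the sketch below spells out concept and role constructors such a language is usually taken to contain; beyond the complements, number restrictions, and role conjunctions named in the abstract, the exact constructor set shown is an assumption on our part.

```latex
% Sketch of ALCNR syntax: A an atomic concept, P_1,...,P_k atomic roles,
% n a non-negative integer. Roles are conjunctions of atomic roles.
\begin{align*}
C, D &::= A \mid \top \mid \bot \mid \neg C \mid C \sqcap D \mid C \sqcup D
          \mid \forall R.C \mid \exists R.C \mid (\geq n\ R) \mid (\leq n\ R) \\
R    &::= P_1 \sqcap \dots \sqcap P_k
\end{align*}
```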

    Structured Knowledge Representation for Image Retrieval

    We propose a structured approach to the problem of retrieving images by content and present a description logic devised for the semantic indexing and retrieval of images containing complex objects. As other approaches do, we start from low-level features extracted through image analysis to detect and characterize regions in an image. However, in contrast with feature-based approaches, we provide a syntax to describe segmented regions as basic objects and complex objects as compositions of basic ones. We then introduce a companion extensional semantics for defining reasoning services, such as retrieval, classification, and subsumption. These services can be used for both exact and approximate matching, using similarity measures. Using our logical approach as a formal specification, we implemented a complete client-server image retrieval system, which allows a user to pose both queries by sketch and queries by example. A set of experiments has been carried out on a testbed of images to assess the retrieval capabilities of the system in comparison with rankings by expert users. Results are presented using a well-established measure of quality borrowed from textual information retrieval.
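    As a concrete illustration of approximate matching of a composite description against segmented regions, here is a toy sketch; the feature names, similarity function, and aggregation rule are illustrative assumptions, not the description logic defined in the paper.

```python
# Toy sketch: match a composite-object query (a list of basic-part feature
# descriptions) against the regions segmented from one image.
from dataclasses import dataclass

@dataclass
class Region:
    features: dict  # e.g. {"hue": 0.1, "area": 0.3, "elongation": 0.7}

def similarity(query: dict, region: Region) -> float:
    """Crude similarity: 1 minus the mean absolute feature distance."""
    keys = query.keys() & region.features.keys()
    if not keys:
        return 0.0
    dist = sum(abs(query[k] - region.features[k]) for k in keys) / len(keys)
    return max(0.0, 1.0 - dist)

def match_composite(query_parts: list[dict], regions: list[Region]) -> float:
    """Score an image: every basic part of the query must find its best region."""
    return min(max(similarity(p, r) for r in regions) for p in query_parts)
```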

    Space Efficiency of Propositional Knowledge Representation Formalisms

    We investigate the space efficiency of Propositional Knowledge Representation (PKR) formalisms. Intuitively, the space efficiency of a formalism F in representing a certain piece of knowledge A is the size of the shortest formula of F that represents A. In this paper we assume that knowledge is either a set of propositional interpretations (models) or a set of propositional formulae (theorems). We provide a formal way of talking about the relative ability of PKR formalisms to compactly represent a set of models or a set of theorems. We introduce two new compactness measures and the corresponding classes, and show that the relative space efficiency of a PKR formalism in representing models/theorems is directly related to such classes. In particular, we consider formalisms for nonmonotonic reasoning, such as circumscription and default logic, as well as belief revision operators and the stable model semantics for logic programs with negation. One interesting result is that formalisms with the same time complexity do not necessarily belong to the same space efficiency class.
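    Written out, the notion of space efficiency used above is (the symbols are ours; the compactness measures themselves are defined in the paper):

```latex
% Size of the most compact representation of knowledge A in formalism F
\mathrm{size}_F(A) \;=\; \min \{\, |\varphi| \;:\; \varphi \in F,\ \varphi \text{ represents } A \,\}
```

    A formalism F_1 can then be regarded as at least as space efficient as F_2 when every A representable in F_2 has a representation in F_1 whose size is polynomial in size_{F_2}(A); this polynomial-size comparison is a sketch of the idea, not the paper's exact definition.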

    Semantic Matchmaking as Non-Monotonic Reasoning: A Description Logic Approach

    Matchmaking arises when supply and demand meet in an electronic marketplace, when agents search for a web service to perform some task, or when recruiting agencies match curricula and job profiles. In such open environments, the objective of a matchmaking process is to discover the best available offers for a given request. We address the problem of matchmaking from a knowledge representation perspective, with a formalization based on Description Logics. We devise Concept Abduction and Concept Contraction as non-monotonic inferences in Description Logics suitable for modeling matchmaking in a logical framework, and prove some related complexity results. We also present reasonable algorithms for semantic matchmaking based on the devised inferences, and prove that they obey some commonsense properties. Finally, we report on the implementation of the proposed matchmaking framework, which has been used both as a mediator in e-marketplaces and for semantic web service discovery.
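    The toy sketch below mirrors the spirit of ranking offers by what is missing (abduction) and what must be given up (contraction); it works on plain feature sets with hand-declared conflicts rather than on Description Logic concepts, so it only illustrates the idea, not the paper's algorithms.

```python
# Set-based stand-in for Concept Abduction and Concept Contraction.
# Features are strings; the conflict pairs are illustrative assumptions.
CONFLICTS = {("smoking_room", "non_smoking_room")}

def contract(demand: set[str], supply: set[str]) -> set[str]:
    """Part of the demand to retract because it conflicts with the supply."""
    return {d for d in demand
            if any((d, s) in CONFLICTS or (s, d) in CONFLICTS for s in supply)}

def abduce(demand: set[str], supply: set[str]) -> set[str]:
    """Hypotheses to add to the supply so that it covers the retained demand."""
    return (demand - contract(demand, supply)) - supply

def rank(demand: set[str], offers: dict[str, set[str]]) -> list[tuple[str, int]]:
    """Lower penalty = better match: missing features plus conflicting ones."""
    penalty = {name: len(abduce(demand, s)) + len(contract(demand, s))
               for name, s in offers.items()}
    return sorted(penalty.items(), key=lambda kv: kv[1])

# Example: rank two offers against a demand.
demand = {"non_smoking_room", "wifi", "parking"}
offers = {"hotel_a": {"wifi", "smoking_room"}, "hotel_b": {"wifi", "parking"}}
print(rank(demand, offers))  # hotel_b (penalty 1) before hotel_a (penalty 2)
```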

    A minimal Beta Beam with high-Q ions to address CP violation in the leptonic sector

    In this paper we consider a Beta Beam setup that leverages existing European facilities as much as possible: a setup that takes advantage of facilities at CERN to boost high-Q ions (8Li and 8B) aimed at a far detector located at L = 732 km in the Gran Sasso Underground Laboratory. The average neutrino energy for 8Li and 8B ions boosted at gamma ~ 100 is in the range E_nu = [1, 2] GeV, high enough to use a large iron detector of the MINOS type at the far site. We then perform a study of the neutrino and antineutrino fluxes needed to measure a CP-violating phase delta in a significant part of the parameter space. In particular, for theta_13 > 3 deg, if an antineutrino flux of 3x10^19 useful 8Li decays per year is achievable, we find that delta can be measured in 60% of the parameter space with 6x10^18 useful 8B decays per year.
    Comment: 19 pages, 10 figures; added references and corrected a typo
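    As a rough cross-check of the quoted energy range, the sketch below boosts assumed rest-frame mean neutrino energies; the ~6-7 MeV values are our assumptions (of order half the effective beta endpoints of 8Li and 8B), since the abstract only quotes the boosted result.

```python
# For forward-going neutrinos from a beta emitter boosted by a Lorentz
# factor gamma, E_lab ~ 2 * gamma * E_rest. The rest-frame mean energies
# below are assumptions, not values taken from the paper.
gamma = 100
for ion, e_rest_mev in [("8Li", 6.5), ("8B", 7.0)]:
    e_lab_gev = 2 * gamma * e_rest_mev * 1e-3
    print(f"{ion}: E_nu ~ {e_lab_gev:.1f} GeV")  # ~1.3-1.4 GeV, inside [1, 2] GeV
```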

    A Beta Beam complex based on the machine upgrades for the LHC

    The Beta Beam CERN design is based on the present LHC injection complex, and its physics reach is mainly limited by the maximum rigidity of the SPS. In fact, some of the scenarios for the machine upgrades of the LHC, particularly the construction of a fast-cycling 1 TeV injector (the "Super-SPS"), are highly synergetic with the construction of a higher-gamma Beta Beam. At the energies that can be reached by this machine, we demonstrate that dense calorimeters can already be used for the detection of neutrinos at the far location. Even at moderate masses (40 kton), as imposed by the use of the existing underground halls at Gran Sasso, the CP reach is very large for any value of theta_13 that would provide evidence of nu_e appearance at T2K or NOvA (theta_13 >= 3 deg). Exploitation of matter effects at the CERN-to-Gran Sasso distance provides sensitivity to the neutrino mass hierarchy in significant areas of the theta_13-delta plane.

    Precision on leptonic mixing parameters at future neutrino oscillation experiments

    We perform a comparison of future neutrino oscillation experiments based on the achievable precision in the determination of the fundamental parameters theta_13 and the CP phase delta, assuming that theta_13 is in the range indicated by the recent Daya Bay measurement. We study the non-trivial dependence of the error on delta on its true value. When matter effects are small, the largest error is found at the points where CP violation is maximal, and the smallest at the CP-conserving points. The situation is different when matter effects are sizable. As a result, comparing the physics reach of different experiments on the basis of the CP discovery potential, as is usually done, can be misleading. We have compared various proposed super-beam, beta-beam, and neutrino factory setups on the basis of the relative precision on theta_13 and the error on delta. Neutrino factories, both high-energy and low-energy, outperform the alternative beam technologies. An ultimate precision on theta_13 below 3% and an error on delta of less than 7 degrees at 1 sigma (1 d.o.f.) can be obtained at a neutrino factory.
    Comment: Minor changes, matches the version accepted in JHEP. 30 pages, 9 figures

    Assessment of body composition in health and disease using bioelectrical impedance analysis (bia) and dual energy x-ray absorptiometry (dxa): A critical overview

    The measurement of body composition (BC) represents a valuable tool to assess nutritional status in health and disease. The methods most used to evaluate BC in clinical practice are based on bicompartment models and measure, directly or indirectly, fat mass (FM) and fat-free mass (FFM). Bioelectrical impedance analysis (BIA) and dual energy X-ray absorptiometry (DXA), the latter nowadays considered the reference technique in clinical practice, are extensively used in epidemiological (mainly BIA) and clinical (mainly DXA) settings to evaluate BC. DXA is primarily used for the measurement of bone mineral content (BMC) and density to assess bone health and diagnose osteoporosis in defined anatomical regions (femur and spine). However, total body DXA scans are used to derive a three-compartment BC model comprising BMC, FM, and FFM. Both methods have limitations: the accuracy of BIA measurements is reduced when specific predictive equations and standardized measurement protocols are not used, whereas the limitations of DXA include the safety of repeated measurements (no more than two total body scans per year are currently advised), cost, and the need for technical expertise. This review aims to provide useful insights, mostly into the use of BC methods in prevention and clinical practice (for ambulatory or bedridden patients). We believe that it will stimulate discussion on the topic and reinvigorate the crucial role of BC evaluation in diagnostic and clinical investigation protocols.
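    The compartment models referred to above can be written out as simple balance equations (a sketch; LST here denotes the lean soft tissue term reported by DXA).

```latex
% Two-compartment model (e.g. BIA): body weight split into fat and fat-free mass
\text{weight} = \mathrm{FM} + \mathrm{FFM}
% Three-compartment model from a total body DXA scan
\text{weight} = \mathrm{FM} + \mathrm{LST} + \mathrm{BMC}, \qquad \mathrm{FFM} = \mathrm{LST} + \mathrm{BMC}
```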